In statistics, ordinary least squares (OLS) or linear least squares is a method for estimating the unknown parameters in a linear regression model. It minimizes the sum of squared differences between the observed responses in the given dataset and the responses predicted by the linear approximation of the data (visually, this is the sum of the squared vertical distances between each data point in the set and the corresponding point on the regression line; the smaller the differences, the better the model fits the data). The resulting estimator can be expressed by a simple formula, especially in the case of a single regressor on the right-hand side. The OLS estimator is consistent when the regressors are exogenous and there is no perfect multicollinearity, and optimal in the class of linear unbiased estimators when the errors are homoscedastic and serially uncorrelated. Under these conditions, the method of OLS provides minimum-variance mean-unbiased estimation when the errors have finite variances. Under the additional assumption that the errors are normally distributed, OLS is the maximum likelihood estimator. OLS is used in economics (econometrics), political science and electrical engineering (control theory and signal processing), among many other areas of application. The Multi-fractional order estimator is an expanded version of OLS.

== Linear model ==

Suppose the data consists of ''n'' observations. Each observation includes a scalar response ''y''<sub>''i''</sub> and a vector of ''p'' predictors (or regressors) ''x''<sub>''i''</sub>. In a linear regression model the response variable is a linear function of the regressors:

:<math>y_i = x_i^\mathsf{T}\beta + \varepsilon_i, \qquad i = 1, \ldots, n,</math>

where ''β'' is a ''p''×1 vector of unknown parameters; the ''ε''<sub>''i''</sub> are unobserved scalar random variables (errors) which account for the discrepancy between the actually observed responses ''y''<sub>''i''</sub> and the "predicted outcomes" ''x''<sub>''i''</sub><sup>T</sup>''β''; and <sup>T</sup> denotes matrix transpose, so that ''x''<sup>T</sup>''β'' is the dot product between the vectors ''x'' and ''β''. This model can also be written in matrix notation as

:<math>y = X\beta + \varepsilon,</math>

where ''y'' and ''ε'' are ''n''×1 vectors, and ''X'' is an ''n''×''p'' matrix of regressors, which is also sometimes called the design matrix.

As a rule, the constant term is always included in the set of regressors ''X'', say, by taking ''x''<sub>''i''1</sub> = 1 for all ''i'' = 1, …, ''n''. The coefficient ''β''<sub>1</sub> corresponding to this regressor is called the ''intercept''.

There may be some relationship between the regressors. For instance, the third regressor may be the square of the second regressor. In this case (assuming that the first regressor is constant) we have a quadratic model in the second regressor. But this is still considered a linear model, because it is linear in the ''β''s.
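The matrix form above maps directly onto a short computation. The following is a minimal sketch in Python with NumPy; the data and parameter values are made up purely for illustration. It builds a design matrix whose first column of ones carries the intercept, generates responses according to ''y'' = ''Xβ'' + ''ε'', and recovers the OLS estimate with a standard least-squares solver.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative sketch only: synthetic data and assumed "true" parameters.
rng = np.random.default_rng(0)
n = 100                                  # number of observations
x = rng.uniform(-3, 3, size=n)           # a single underlying regressor

# Design matrix X: the first column of ones yields the intercept beta_1;
# the third column is the square of the second (the quadratic example above),
# which is still a linear model because it is linear in the betas.
X = np.column_stack([np.ones(n), x, x**2])

beta_true = np.array([1.0, 2.0, -0.5])   # assumed parameters (hypothetical)
eps = rng.normal(scale=0.3, size=n)      # unobserved errors
y = X @ beta_true + eps                  # responses: y = X beta + eps

# OLS estimate of beta, obtained from a least-squares solver rather than
# forming (X^T X)^{-1} X^T y explicitly.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated beta:", beta_hat)       # should be close to beta_true

# OLS minimizes the sum of squared residuals.
residuals = y - X @ beta_hat
print("sum of squared residuals:", residuals @ residuals)
</syntaxhighlight>

Solving with <code>lstsq</code> (a QR/SVD-based routine) rather than inverting ''X''<sup>T</sup>''X'' directly is numerically more stable, especially when the columns of ''X'' are nearly collinear.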